Mini-Batch Primal and Dual Methods for SVMs
Authors
Abstract
We address the issue of using mini-batches in stochastic optimization of SVMs. We show that the same quantity, the spectral norm of the data, controls the parallelization speedup obtained for both primal stochastic subgradient descent (SGD) and stochastic dual coordinate ascent (SDCA) methods, and we use it to derive novel variants of mini-batched SDCA. Our guarantees for both methods are expressed in terms of the original nonsmooth primal problem based on the hinge loss.
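To make the primal side of the abstract concrete, here is a minimal mini-batch Pegasos-style SGD sketch for the hinge-loss SVM. This is an illustration of the general technique, not the paper's exact algorithm or step-size analysis; the function name and all parameter defaults are our own choices.

```python
import numpy as np

def minibatch_pegasos(X, y, lam=0.1, batch_size=8, epochs=20, seed=0):
    """Mini-batch primal SGD (Pegasos-style) for the hinge-loss SVM.

    Minimizes (lam/2)*||w||^2 + (1/n) * sum_i max(0, 1 - y_i * <w, x_i>),
    taking a subgradient step on a random mini-batch at each iteration.
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    t = 0
    for _ in range(epochs):
        for _ in range(n // batch_size):
            t += 1
            eta = 1.0 / (lam * t)              # standard Pegasos step size
            idx = rng.choice(n, size=batch_size, replace=False)
            Xb, yb = X[idx], y[idx]
            margins = yb * (Xb @ w)
            active = margins < 1               # examples violating the margin
            # subgradient of the regularized mini-batch hinge loss
            grad = lam * w - (yb[active] @ Xb[active]) / batch_size
            w -= eta * grad
    return w
```

Increasing `batch_size` averages the subgradient over more examples per step; how much this parallelizes without hurting convergence is exactly what the abstract ties to the spectral norm of the data.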
Similar resources
Primal-dual path-following algorithms for circular programming
Circular programming problems are a new class of convex optimization problems that include second-order cone programming problems as a special case. Alizadeh and Goldfarb [Math. Program. Ser. A 95 (2003) 3-51] introduced primal-dual path-following algorithms for solving second-order cone programming problems. In this paper, we generalize their work by using the machinery of Euclidean Jordan alg...
Beating SGD: Learning SVMs in Sublinear Time
We present an optimization approach for linear SVMs based on a stochastic primal-dual approach, where the primal step is akin to an importance-weighted SGD, and the dual step is a stochastic update on the importance weights. This yields an optimization method with a sublinear dependence on the training set size, and the first method for learning linear SVMs with runtime less the...
Primal-dual Optimization Methods in Neural Networks and Support Vector Machines Training
Recently a lot of attention has been given to applications of mathematical programming to machine learning and neural networks. In this tutorial we investigate the use of Interior Point Methods (IPMs) for training Support Vector Machines (SVMs) and Artificial Neural Networks (ANNs). The training of ANNs is a highly nonconvex optimization problem, in contrast to the SVMs training problem, which is a...
Incremental support vector machine learning in the primal and applications
Most algorithms of support vector machines (SVMs) operate in a batch mode. However, when the samples arrive sequentially, batch implementations of SVMs are computationally demanding due to the fact that they must be retrained from scratch. This paper proposes an incremental SVM algorithm that is suitable for the problems of sequentially arriving samples. Unlike previous SVM techniques, this new...
Randomized Dual Coordinate Ascent with Arbitrary Sampling
We study the problem of minimizing the average of a large number of smooth convex functions penalized with a strongly convex regularizer. We propose and analyze a novel primal-dual method (Quartz) which at every iteration samples and updates a random subset of the dual variables, chosen according to an arbitrary distribution. In contrast to typical analysis, we directly bound the decrease of th...
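The dual methods discussed above build on the basic SDCA coordinate update. As a point of reference, here is a minimal serial SDCA sketch for the hinge-loss SVM, updating one dual variable per step via its closed-form coordinate maximization; this is a textbook-style illustration, not the Quartz method's arbitrary-sampling scheme, and the function name and defaults are our own.

```python
import numpy as np

def sdca_hinge(X, y, lam=0.01, epochs=20, seed=0):
    """Serial SDCA for the hinge-loss SVM (one dual coordinate per step).

    Dual variables alpha_i lie in [0, 1]; the primal iterate is maintained as
    w = (1 / (lam * n)) * sum_i alpha_i * y_i * x_i.
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    alpha = np.zeros(n)
    w = np.zeros(d)
    sq_norms = np.einsum('ij,ij->i', X, X)     # ||x_i||^2 for each example
    for _ in range(epochs):
        for i in rng.permutation(n):
            if sq_norms[i] == 0:
                continue
            # closed-form maximizer of the dual objective in coordinate i,
            # clipped so that alpha_i stays in the box [0, 1]
            grad = 1.0 - y[i] * (X[i] @ w)
            delta = np.clip(alpha[i] + lam * n * grad / sq_norms[i],
                            0.0, 1.0) - alpha[i]
            alpha[i] += delta
            w += (delta * y[i] / (lam * n)) * X[i]
    return w, alpha
```

Mini-batched variants, like those analyzed in the main abstract, update several coordinates per step from the same primal iterate; the safe aggregation of those updates is where the spectral norm of the data enters.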
Journal title:
Volume Issue
Pages -
Publication date: 2013